# ROUGE Evaluation

## Mt5 Base Cnn Then Norsumm
A text summarization model built on the mt5-base architecture, capable of extracting key information from text to generate summaries.
- Tags: Text Generation, Transformers
- Author: GloriaABK1
- Downloads: 105 · Likes: 0
## Khmer Mt5 Summarization 1024tk V2
An improved Khmer text summarization model based on mT5-small, supporting inputs of up to 1024 tokens, suitable for summarizing Khmer articles, paragraphs, or documents.
- License: Apache-2.0
- Tags: Text Generation, Transformers, Other
- Author: songhieng
- Downloads: 16 · Likes: 1
## News Sum Tr
A Turkish news text summarization model based on the mT5 architecture, trained on a dataset of Turkish economic and current affairs news, capable of generating core content summaries of news texts.
- Tags: Text Generation, Transformers, Other
- Author: nebiberke
- Downloads: 30 · Likes: 3
## Model Financial Documents 3
A financial document summarization model fine-tuned from T5-small on the searde/dataset-financial-documents-3 dataset.
- License: Apache-2.0
- Tags: Text Generation, Transformers
- Author: searde
- Downloads: 15 · Likes: 2
## Bart Base Multi News
A text summarization model fine-tuned from facebook/bart-base on the multi_news dataset.
- License: Apache-2.0
- Tags: Text Generation, Transformers, English
- Author: Ssarion
- Downloads: 15 · Likes: 0
## Multi News Diff Weight
A text summarization model fine-tuned from facebook/bart-base on the multi_news dataset, designed specifically for multi-document summarization tasks.
- License: Apache-2.0
- Tags: Text Generation, Transformers
- Author: cs608
- Downloads: 15 · Likes: 0
## Article2kw Test1.1 Barthez Orangesum Title Finetuned For Summerization
A text summarization model fine-tuned from barthez-orangesum-title, specializing in generating headline-style summaries.
- License: Apache-2.0
- Tags: Text Generation, Transformers
- Author: bthomas
- Downloads: 28 · Likes: 0
## Article2kw Test1 Barthez Orangesum Title Finetuned For Summurization
A summary generation model fine-tuned from barthez-orangesum-title on an unknown dataset, supporting French text summarization tasks.
- License: Apache-2.0
- Tags: Text Generation, Transformers
- Author: bthomas
- Downloads: 19 · Likes: 0
## Nonsenseupdatediffstringbart
A pre-trained model based on the BART architecture, primarily used for summary generation and difference comparison tasks.
- Tags: Text Generation, Transformers, English
- Author: hyesunyun
- Downloads: 19 · Likes: 0
## Mt5 Base Wikinewssum English 100
An English summarization model fine-tuned from google/mt5-base on the wikinews dataset, suitable for news summarization tasks.
- License: Apache-2.0
- Tags: Text Generation, Transformers
- Author: airKlizz
- Downloads: 14 · Likes: 0
## MEETING SUMMARY BART LARGE XSUM SAMSUM DIALOGSUM AMI
A sequence-to-sequence model based on the BART architecture, specifically fine-tuned for meeting and dialogue summarization tasks, capable of generating abstractive summaries from various dialogue data.
- License: Apache-2.0
- Tags: Text Generation, Transformers, English
- Author: knkarthick
- Downloads: 119 · Likes: 15
## T5 Podcast Summarisation
An automatic podcast summarization model fine-tuned from T5-base, trained on the Spotify podcast dataset.
- Tags: Text Generation, Transformers, English
- Author: paulowoicho
- Downloads: 64 · Likes: 9
## T5 CNN
A summarization model trained on the CNN/Daily Mail dataset, primarily used for generating concise summaries of news articles.
- Tags: Text Generation, Transformers
- Author: MohamedZaitoon
- Downloads: 14 · Likes: 0
## Bart Fine Tune
A summarization model trained on the CNN/Daily Mail dataset, capable of automatically generating concise summaries of news articles.
- Tags: Text Generation
- Author: MohamedZaitoon
- Downloads: 23 · Likes: 1
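
ROUGE figures for models like those listed above can be reproduced with the Hugging Face `transformers` and `evaluate` libraries. The sketch below is illustrative only: `facebook/bart-base` (the base checkpoint named in several entries) stands in for whichever fine-tuned model you actually want to score, and the article/reference pair is a made-up placeholder rather than data from any of the datasets mentioned here.

```python
# Minimal ROUGE-evaluation sketch (assumes: pip install transformers evaluate rouge_score).
# "facebook/bart-base" is only a stand-in; swap in the Hub ID of one of the
# fine-tuned summarization models listed above for meaningful scores.
from transformers import pipeline
import evaluate

summarizer = pipeline("summarization", model="facebook/bart-base")
rouge = evaluate.load("rouge")

# Placeholder evaluation pair; in practice, iterate over a real test split
# (e.g. multi_news or CNN/Daily Mail, as referenced in the entries above).
articles = [
    "The central bank raised interest rates by a quarter point on Tuesday, "
    "citing persistent inflation and a tight labour market."
]
references = [
    "The central bank raised rates by 0.25 points due to inflation."
]

# Generate candidate summaries, truncating over-long inputs to the model limit.
predictions = [
    out["summary_text"]
    for out in summarizer(articles, max_length=60, min_length=10, truncation=True)
]

# Returns aggregated ROUGE-1/2/L/Lsum F-measures in [0, 1].
scores = rouge.compute(predictions=predictions, references=references)
print(scores)
```

Since ROUGE rewards n-gram overlap with the reference, a base checkpoint scored this way will generally fall well below the fine-tuned models in this list on their respective test sets.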